Keep going, you're the ONLY one doing this!
Excellent! I've been looking for a detailed build to follow along with. Everyone else seems to do only a synopsis.
Nice video
I'm interested in part 3!
Awesome, bro, waiting for parts 3, 4, and 5!
Awesome tutorial... when do you plan to release parts 3, 4, and 5? Good work, keep it up!
never :-)
Great job, man! When are you planning to release the other parts? Is part 3 ready?
looking forward to the third part
I'm waiting for the next parts.
Thank you so much, that really helped me finish my sophisticated project! Thank you again, keep it up, champ!
Awesome, waiting for the next part!
Please Part 3!!!
Waiting for Part 3 !
We want Part 3!!
Great project, it's very nice, thanks for the video!
wooow!
I really like it!! Will you do part 3 anytime soon?
Subscribed
Hey man, really nice project! May I ask why exactly you've chosen the Jetson Nano controller? Any pros/cons over a "regular" Raspberry Pi?
Thanks a lot for the compliments! In terms of pros over an RPi, I would say more USB 3.0 ports, a dedicated GPU (some ROS libraries take advantage of that), and, most importantly, the acceptance by the community, which has led to a whole bunch of topics and solutions using the Jetson Nano in more complex robots.
@@Brain_Robotics thanks a lot!
Could we have parts 3 and 4?
It's a good project to get busy with. I'm interested in knowing if we can use the LiDAR with an Arduino, and can you please give me more information about the navigation system that you are using?
Yes, you can! That is the best thing about robotics, ahahaha. Depending on the interface you can use an Arduino, but a LiDAR normally comes with USB or another type of interface. Do you have any specific questions about the navigation?
Wow, nice job. It's easy to do now with ChatGPT. How's your project going?
Did you set up the VESC using the VESC Tool application and then install the ROS vesc package?
Still no part 3? They were great videos.
yeah
Hey, could you help me build an autonomous car that drives through an office?
Hi, like many video makers, I'd need an AI video operator. Do you think it's possible to implement face tracking on such a car? Maybe there are already commercial products like this? I know of dozens of face-tracking flying drones, but no ground drones… thanks!
Not sure if there is a terrestrial face-tracking platform commercially available. You could develop something like this, but it would probably require some stabilization on the camera to achieve better results.
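To make the idea concrete, here is a minimal sketch of the steering side of face tracking. It assumes some detector (e.g. an OpenCV Haar cascade) has already produced a face bounding box in pixel coordinates; the frame width, gain, and function name are all illustrative, not from the video.

```python
# Turn a detected face bounding box into a proportional steering command.
# box = (x, y, w, h) in pixels; output is clamped to [-1.0, 1.0],
# where positive steers right and negative steers left.

def steering_from_face(box, frame_width=640, gain=0.005):
    x, _, w, _ = box
    face_center = x + w / 2.0
    error = face_center - frame_width / 2.0  # horizontal offset in pixels
    steering = gain * error                  # proportional control only
    return max(-1.0, min(1.0, steering))

# Face centered in the frame -> drive straight.
print(steering_from_face((300, 100, 40, 40)))  # 0.0
```

A real build would run this per frame and likely add a dead-band and smoothing (or a full PID loop) so the car does not twitch on every detection jitter, which is also where the camera stabilization mentioned above helps.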
No part 3?
When will the next video come? Please upload fast!
Still there, my friend? I hope so. I'm disabled, and I am making an RC grass mower. I used another video to list parts; I will use more batteries and larger motors.
However, I know this can be automated. Husqvarna's robotic mowers have the tech, but the prices are thousands of dollars. I am using a small gas-motor push mower.
I noticed they use a GPS reference point on the property. You have LiDAR.
I will need both: avoid objects and map the location.
(Missing is a camera, for safety, to detect humans and shut down.)
I think I can set the LiDAR to get as close as 20 mm to an object, but I need a command to go closely around the object.
Next, I'd program the mower to turn one side of its wheels to cut the next line (not miss any grass).
Can the Nano be programmed for this? Or can a camera view the grass?
Won't it continue?
Bro, part 3?
Hi, how can I contact you? I would like to get details about the parts, costs, etc.
There is a link to the GitHub repo in the description. In the repo I put the parts and costs.
Hello,
Can you help me drive the autonomous car between the detected road lines? As the lines turn, the car should turn too. Please refer me to a simple technique to control the steering angle.
Thank you!
I know that some projects use a color-space threshold to filter the lines out of the input image. Then you can use some metric to predict where the lines you see will go and move your robot along them. Check out this link: medium.com/computer-car/udacity-self-driving-car-nanodegree-project-1-finding-lane-lines-9cd6a846c58c
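The color-threshold idea above can be sketched in a few lines of NumPy: keep only bright (near-white) pixels, then fit a straight line through them. The threshold value and the tiny synthetic frame here are illustrative, not from the linked article.

```python
import numpy as np

def fit_lane_line(image, threshold=200):
    """image: HxW grayscale array. Fits x = m*y + b through bright pixels.

    Returns (m, b), or None if too few pixels survive the threshold.
    Fitting x as a function of y avoids infinite slopes for vertical lanes.
    """
    ys, xs = np.nonzero(image >= threshold)  # coordinates of bright pixels
    if len(xs) < 2:
        return None
    m, b = np.polyfit(ys, xs, 1)             # least-squares line fit
    return m, b

# Tiny synthetic frame: a bright diagonal "lane line" on a dark background.
frame = np.zeros((10, 10), dtype=np.uint8)
for y in range(10):
    frame[y, y] = 255                        # pixels along x = y

m, b = fit_lane_line(frame)                  # m close to 1, b close to 0
```

Once you have a line per lane edge, one simple steering metric is the offset between the lane center (evaluated at the bottom image row) and the image center, fed into a proportional controller.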
Part 3?
Hey bro, I have one question:
Can we give Google Maps directions to the RC car?
@@lokesh_jadhao4383 Yes, you can. You will need some manual conversion, depending on how you localize your robot within the map. Check out this thread:
answers.ros.org/question/286317/using-google-maps-to-tell-the-robot-where-to-go/
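The "manual conversion" mentioned above usually means projecting GPS latitude/longitude into the robot's local x/y map frame in metres. A minimal sketch using an equirectangular approximation (fine over a few hundred metres; the origin point is arbitrary and must match wherever your map's origin sits):

```python
import math

EARTH_RADIUS = 6378137.0  # metres, WGS-84 equatorial radius

def gps_to_local(lat, lon, origin_lat, origin_lon):
    """Project a GPS fix to (x_east, y_north) metres from the map origin."""
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    # Scale longitude by cos(latitude): degrees of longitude shrink
    # toward the poles.
    x = EARTH_RADIUS * d_lon * math.cos(math.radians(origin_lat))
    y = EARTH_RADIUS * d_lat
    return x, y

# 0.001 degrees of longitude at the equator is roughly 111 metres east.
x, y = gps_to_local(0.0, 0.001, 0.0, 0.0)
```

In a ROS setup, the resulting (x, y) would be sent as a navigation goal in the map frame; packages like robot_localization also provide ready-made GPS-to-odometry conversion if you'd rather not do it by hand.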
part 3 :(
You left us hanging!
That controller choice didn't age so well......